Joeyonng

Table of contents

  • Orthogonal Complement
  • Orthogonal Complementary Subspaces
    • Properties of orthogonal complementary subspaces
  • Orthogonal Decomposition

9  Orthogonal Complement and Decomposition

Orthogonal Complement

For a subset \mathcal{M} of an inner-product space \mathcal{V}, the orthogonal complement \mathcal{M}^{\perp} of \mathcal{M} is defined to be the set of all vectors in \mathcal{V} that are orthogonal to every vector in \mathcal{M}

\mathcal{M}^{\perp} = \left\{ x \in \mathcal{V} \mid \langle x, m \rangle = 0, \forall m \in \mathcal{M} \right\}.

The set \mathcal{M}^{\perp} is a subspace even if \mathcal{M} is not.

Proof

The set \mathcal{M}^{\perp} contains the zero vector (since \langle 0, m \rangle = 0 for every m \in \mathcal{M}) and is closed under addition and scalar multiplication.

Suppose x, y \in \mathcal{M}^{\perp}, which means

\langle x, m \rangle = 0, \forall m \in \mathcal{M},

\langle y, m \rangle = 0, \forall m \in \mathcal{M}.

Closed under addition:

\langle x + y, m \rangle = \langle x, m \rangle + \langle y, m \rangle = 0, \forall m \in \mathcal{M}.

Closed under scalar multiplication:

\langle \alpha x, m \rangle = \alpha \langle x, m \rangle = 0, \forall m \in \mathcal{M}, \forall \alpha \in \mathbb{F}.
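As a concrete numerical sketch (the spanning set \mathcal{M} below is an illustrative assumption, not from the text), the orthogonal complement of a span in \mathbb{R}^{n} is exactly the null space of the matrix whose rows are the spanning vectors, which the SVD recovers:

```python
import numpy as np

# Hypothetical example: M is spanned by the rows of S in R^4.
S = np.array([[1.0, 0.0, 1.0, 0.0],
              [0.0, 1.0, 0.0, 1.0]])

# x is orthogonal to every vector in M  <=>  S @ x = 0, so M^perp is
# the null space of S. The right singular vectors belonging to
# (numerically) zero singular values span that null space.
_, s, Vt = np.linalg.svd(S)
rank = int(np.sum(s > 1e-10))
M_perp = Vt[rank:]          # rows form an orthonormal basis of M^perp

# Every basis vector of M^perp is orthogonal to every row of S.
print(np.allclose(S @ M_perp.T, 0))   # True
```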

Orthogonal Complementary Subspaces

When \mathcal{M} is a subspace of a finite-dimensional inner-product space \mathcal{V}, \mathcal{M} and \mathcal{M}^{\perp} are complementary subspaces in \mathcal{V}

\mathcal{V} = \mathcal{M} \oplus \mathcal{M}^{\perp}.

Proof

Let \mathcal{B}_{\mathcal{M}} = \{ u_{1}, \dots, u_{r} \} be an orthonormal basis of \mathcal{M} (obtained, for example, by the Gram-Schmidt process). For any x \in \mathcal{V}, define

m = \sum_{i = 1}^{r} \langle x, u_{i} \rangle u_{i}, \quad n = x - m.

Clearly m \in \mathcal{M}, and for each u_{j}

\langle n, u_{j} \rangle = \langle x, u_{j} \rangle - \sum_{i = 1}^{r} \langle x, u_{i} \rangle \langle u_{i}, u_{j} \rangle = \langle x, u_{j} \rangle - \langle x, u_{j} \rangle = 0,

so n is orthogonal to every vector in \mathcal{M}, that is, n \in \mathcal{M}^{\perp}. Hence \mathcal{V} = \mathcal{M} + \mathcal{M}^{\perp}.

Moreover, if z \in \mathcal{M} \cap \mathcal{M}^{\perp}, then \langle z, z \rangle = 0, which forces z = 0. Therefore the sum is direct:

\mathcal{V} = \mathcal{M} \oplus \mathcal{M}^{\perp}.
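The decomposition \mathcal{V} = \mathcal{M} \oplus \mathcal{M}^{\perp} can be illustrated numerically. In this hypothetical sketch, \mathcal{M} is a line in \mathbb{R}^{3} and x is split into its unique components in \mathcal{M} and \mathcal{M}^{\perp}:

```python
import numpy as np

# Hypothetical example in R^3: M = span{(1, 1, 0)}.
u = np.array([1.0, 1.0, 0.0])
u = u / np.linalg.norm(u)          # orthonormal basis of M

x = np.array([3.0, 1.0, 2.0])

# Unique decomposition x = m + n with m in M and n in M^perp.
m = np.dot(x, u) * u               # component of x in M
n = x - m                          # remainder lies in M^perp

print(m)             # [2. 2. 0.]
print(n)             # [ 1. -1.  2.]
print(np.dot(n, u))  # ~0: n is orthogonal to M
```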

Properties of orthogonal complementary subspaces

Suppose \mathcal{M} is a subspace of an n-dimensional inner-product space \mathcal{V}.

  • \text{dim} (\mathcal{M}) + \text{dim} (\mathcal{M}^{\perp}) = n

    Proof

Since \mathcal{M}^{\perp} and \mathcal{M} are complementary subspaces, the property of complementary subspaces shows that

    \mathcal{B}_{\mathcal{M}} \cup \mathcal{B}_{\mathcal{M}^{\perp}} = \mathcal{B}_{\mathcal{V}}

    and

    \mathcal{B}_{\mathcal{M}} \cap \mathcal{B}_{\mathcal{M}^{\perp}} = \emptyset.

    Thus,

    \begin{aligned} \lvert \mathcal{B}_{\mathcal{M}} \rvert + \lvert \mathcal{B}_{\mathcal{M}^{\perp}} \rvert & = \lvert \mathcal{B}_{\mathcal{V}} \rvert \\ \text{dim} (\mathcal{M}) + \text{dim} (\mathcal{M}^{\perp}) & = n \end{aligned}

  • \mathcal{M}^{\perp^{\perp}} = \mathcal{M}.

    Proof

    Since \mathcal{M}^{\perp^{\perp}} \subseteq \mathcal{V},

    x \in \mathcal{M}^{\perp^{\perp}} \Rightarrow x \in \mathcal{V}.

    Since \mathcal{M} and \mathcal{M}^{\perp} are complementary subspaces, every x \in \mathcal{V} can be uniquely represented by m \in \mathcal{M} and n \in \mathcal{M}^{\perp}

    x = m + n.

Since every x \in \mathcal{M}^{\perp^{\perp}} is orthogonal to every vector in \mathcal{M}^{\perp}, in particular to n, by definition

    \begin{aligned} 0 & = \langle x, n \rangle \\ & = \langle m + n, n \rangle \\ & = \langle m, n \rangle + \langle n, n \rangle \\ & = \langle n, n \rangle & [\mathcal{M} \perp \mathcal{M}^{\perp} \Rightarrow \langle m, n \rangle = 0]. \end{aligned}

    By the definition of the inner product,

    \langle n, n \rangle = 0 \Rightarrow n = 0.

    Thus, for each x \in \mathcal{M}^{\perp^{\perp}}

    x = m \in \mathcal{M} \Rightarrow \mathcal{M}^{\perp^{\perp}} \subseteq \mathcal{M}.

By the dimension property above,

\text{dim} (\mathcal{M}) = n - \text{dim} (\mathcal{M}^{\perp}) = \text{dim} (\mathcal{M}^{\perp^{\perp}}).

Since \mathcal{M}^{\perp^{\perp}} \subseteq \mathcal{M} and the two subspaces have the same dimension,

\mathcal{M} = \mathcal{M}^{\perp^{\perp}}.
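Both properties can be checked numerically on a hypothetical subspace (the spanning vectors below are arbitrary assumptions used only for illustration):

```python
import numpy as np

def orth_complement(S, tol=1e-10):
    """Orthonormal basis (as rows) of the orthogonal complement of row-span(S)."""
    _, s, Vt = np.linalg.svd(S)
    rank = int(np.sum(s > tol))
    return Vt[rank:]

# Hypothetical subspace M of R^5 spanned by 3 random vectors.
rng = np.random.default_rng(0)
S = rng.standard_normal((3, 5))

P = orth_complement(S)          # orthonormal basis of M^perp
Q = orth_complement(P)          # orthonormal basis of M^perp^perp

# dim(M) + dim(M^perp) = n
print(np.linalg.matrix_rank(S) + P.shape[0])   # 5

# M^perp^perp = M: projecting the rows of S onto span(Q)
# (projector Q.T @ Q, since the rows of Q are orthonormal)
# changes nothing, so every row of S already lies in span(Q).
print(np.allclose(S @ Q.T @ Q, S))             # True
```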

Orthogonal Decomposition

For every \mathbf{A} \in \mathbb{R}^{m \times n} (for a real matrix, \mathbf{A}^{H} = \mathbf{A}^{T}),

R (\mathbf{A}) \perp N (\mathbf{A}^{H}),

N (\mathbf{A}) \perp R (\mathbf{A}^{H}),

which means that every matrix \mathbf{A} \in \mathbb{R}^{m \times n} produces an orthogonal decomposition of \mathbb{R}^{m} and \mathbb{R}^{n} in the sense that

\mathbb{R}^{m} = R (\mathbf{A}) \oplus R (\mathbf{A})^{\perp} = R (\mathbf{A}) \oplus N (\mathbf{A}^{H}),

\mathbb{R}^{n} = N (\mathbf{A}) \oplus N (\mathbf{A})^{\perp} = N (\mathbf{A}) \oplus R (\mathbf{A}^{H}).

Proof

Suppose \mathbf{A} \in \mathbb{R}^{m \times n} and \mathbf{x} \in R (\mathbf{A})^{\perp}.

For every \mathbf{y} \in \mathbb{R}^{n}, we have

\begin{aligned} \mathbf{x} \in R (\mathbf{A})^{\perp} \iff \langle \mathbf{A} \mathbf{y}, \mathbf{x} \rangle & = 0 \\ (\mathbf{A} \mathbf{y})^{H} \mathbf{x} & = 0 \\ \mathbf{y}^{H} \mathbf{A}^{H} \mathbf{x} & = 0 \\ \langle \mathbf{y}, \mathbf{A}^{H} \mathbf{x} \rangle & = 0. \end{aligned}

Since this holds for every \mathbf{y} \in \mathbb{R}^{n} (in particular \mathbf{y} = \mathbf{A}^{H} \mathbf{x}), the positive definiteness of the inner product gives

\langle \mathbf{y}, \mathbf{A}^{H} \mathbf{x} \rangle = 0, \forall \mathbf{y} \in \mathbb{R}^{n} \iff \mathbf{A}^{H} \mathbf{x} = \mathbf{0} \iff \mathbf{x} \in N (\mathbf{A}^{H}).

Thus,

R (\mathbf{A})^{\perp} = N (\mathbf{A}^{H}).

Replacing \mathbf{A} with \mathbf{A}^{H} above, and then taking the orthogonal complement of both sides (using \mathcal{M}^{\perp^{\perp}} = \mathcal{M}), we get

\begin{aligned} R (\mathbf{A}^{H})^{\perp} & = N (\mathbf{A}). \\ R (\mathbf{A}^{H}) & = N (\mathbf{A})^{\perp}. \\ \end{aligned}

Since R (\mathbf{A}) is a subspace in \mathbb{R}^{m} and N (\mathbf{A}) is a subspace in \mathbb{R}^{n},

R (\mathbf{A}) \oplus R (\mathbf{A})^{\perp} = R (\mathbf{A}) \oplus N (\mathbf{A}^{H}) = \mathbb{R}^{m},

N (\mathbf{A}) \oplus N (\mathbf{A})^{\perp} = N (\mathbf{A}) \oplus R (\mathbf{A}^{H}) = \mathbb{R}^{n}.
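A small numerical sketch of this decomposition (the matrix \mathbf{A} is an illustrative assumption; since it is real, \mathbf{A}^{H} = \mathbf{A}^{T}). The SVD delivers orthonormal bases of all four subspaces at once: the left singular vectors with nonzero singular values span R (\mathbf{A}), and the remaining ones span N (\mathbf{A}^{T}).

```python
import numpy as np

A = np.array([[1.0, 2.0, 0.0],
              [2.0, 4.0, 0.0]])    # rank 1, so m = 2, n = 3
m, n = A.shape

def svd_bases(M, tol=1e-10):
    """Orthonormal bases (as columns) of R(M) and of N(M^T) = R(M)^perp."""
    U, s, _ = np.linalg.svd(M)
    r = int(np.sum(s > tol))
    return U[:, :r], U[:, r:]

R_A, N_AT = svd_bases(A)      # R(A) and N(A^T), both subspaces of R^m
R_AT, N_A = svd_bases(A.T)    # R(A^T) and N(A), both subspaces of R^n

# R(A) is orthogonal to N(A^T).
print(np.allclose(R_A.T @ N_AT, 0))                            # True

# R^m = R(A) (+) N(A^T): the two bases together form an
# orthonormal basis of R^m.
B = np.hstack([R_A, N_AT])
print(B.shape == (m, m) and np.allclose(B.T @ B, np.eye(m)))   # True

# R^n = N(A) (+) R(A^T): same on the domain side.
C = np.hstack([R_AT, N_A])
print(C.shape == (n, n) and np.allclose(C.T @ C, np.eye(n)))   # True
```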
